I'm encountering an issue encoding and decoding a custom struct with SwiftData. As it all happens behind SwiftData's generated code and a decoder, I'm not really sure what's going on.
I have a custom type defined kind of like this:
public struct Group<Key: Hashable, Element: Hashable> {
    private var elementGroups: [Element: Key] = [:]
    private var groupedElements: [Key: [Element]] = [:]

    public init() {}
}
In short, it allows multiple elements (usually strings) to be grouped and referenced by some key.
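For context, the mutating API keeps the two dictionaries in sync along these lines (simplified; the real method names differ):

extension Group {
    public mutating func insert(_ element: Element, into key: Key) {
        // Keep both lookups consistent: element -> key, and key -> elements.
        elementGroups[element] = key
        groupedElements[key, default: []].append(element)
    }
}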
I've added Codable conformance to this type, so I can encode and decode it. For simplicity, only elementGroups is encoded/decoded, and groupedElements is rebuilt when decoding. My implementation is similar to this:
extension Group: Codable where Key: Codable, Element: Codable {
    private enum Keys: CodingKey {
        case groups
    }

    public init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: Keys.self)
        let decoded = try container.decode([Element: Key].self, forKey: .groups)

        // Enumerate over the element groups, and rebuild the grouped elements.
        var elements: [Key: [Element]] = [:]
        for group in decoded {
            elements[group.value, default: []].append(group.key)
        }

        elementGroups = decoded
        groupedElements = elements
    }

    public func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: Keys.self)
        try container.encode(elementGroups, forKey: .groups)
    }
}
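For reference, a plain JSON round-trip with this conformance behaves as expected (a quick sketch, using the insert helper from above):

import Foundation

var groups = Group<UUID, String>()
groups.insert("Alice", into: UUID())

// Encoding and decoding through JSON reproduces the same structure.
let data = try JSONEncoder().encode(groups)
let decoded = try JSONDecoder().decode(Group<UUID, String>.self, from: data)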
This works fine when encoding and decoding JSON, but when I attempt to use this structure as a value within a SwiftData model, decoding the type crashes my application.
@Model
final class MyModel {
    var id: UUID = UUID()
    var groups: Group<UUID, String> = Group<UUID, String>()

    init(id: UUID) {
        self.id = id
    }
}
When something attempts to decode the groups property on the MyModel object, it crashes with the following error:
Could not cast value of type 'Swift.Optional<Any>' (0x1ed7d0290) to 'Swift.Dictionary<Swift.String, Foundation.UUID>' (0x121fa9448).
I would guess that there is a nil value stored for groups in SwiftData, and attempting to decode it as a Group<UUID, String> is failing. It's odd that it hard crashes rather than throwing an error, though. I'm also not sure why the value is optional, as a value is always written out.
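A possible way to test that theory would be to declare the property optional, so that a stored nil decodes as nil rather than force-casting (a sketch; I haven't confirmed this avoids the crash):

@Model
final class MyModel {
    var id: UUID = UUID()

    // With an optional, a nil in the underlying store should decode as nil
    // instead of force-casting Optional<Any> and crashing.
    var groups: Group<UUID, String>? = Group<UUID, String>()

    init(id: UUID) {
        self.id = id
    }
}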
Does anyone have any ideas?
Is it possible to animate a property on a RealityKit component? For example, the OpacityComponent has an opacity property that controls the opacity of the entities it's attached to. I would like to animate that property so the entity fades in and out.
I've been looking at the animation API for RealityKit, and it either assumes the animation comes from a USDZ (which this does not), or it allows properties of the entities themselves to be animated using a BindTarget. I'm not sure how either can be adapted to modify component properties.
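The only fallback I can see is to drive the component manually every frame with a custom System, along these lines (a rough sketch; FadeComponent is my own type, and both types would need registering at startup):

import Foundation
import RealityKit

// My own marker component describing the fade (not a RealityKit type).
struct FadeComponent: Component {
    var from: Float
    var to: Float
    var duration: TimeInterval
    var elapsed: TimeInterval = 0
}

// Interpolates OpacityComponent manually every frame. Requires
// FadeComponent.registerComponent() and FadeSystem.registerSystem()
// to be called once at app startup.
struct FadeSystem: System {
    static let query = EntityQuery(where: .has(FadeComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard var fade = entity.components[FadeComponent.self] else { continue }
            fade.elapsed += context.deltaTime
            let t = Float(min(fade.elapsed / fade.duration, 1))
            entity.components.set(OpacityComponent(opacity: fade.from + (fade.to - fade.from) * t))
            if t >= 1 {
                entity.components.remove(FadeComponent.self)
            } else {
                entity.components.set(fade)
            }
        }
    }
}

But this feels like reimplementing something the animation system should do for me.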
Am I missing something?
Thanks
I am trying to use a ShareLink to share multiple Transferables, and I cannot work out which of the initialisers to use - none seem to work.
Assuming I have a transferable that takes some data and processes it asynchronously:
import CoreTransferable

struct MyTransferable: Transferable {
    let renderer: Renderer

    static var transferRepresentation: some TransferRepresentation {
        DataRepresentation(exportedContentType: .png) { transferable in
            let image = try await transferable.renderer.render()
            return image
        }
    }
}
In SwiftUI, I want to share N of these transferables. For example:
import SwiftUI

struct MyView: View {
    private var transferables: [any Transferable] {
        [MyTransferable(renderer: Renderer()), MyTransferable(renderer: Renderer())]
    }

    var body: some View {
        ShareLink("Renders", items: transferables)
    }
}
But the compiler doesn't like this - it complains with "No exact matches in call to initializer".
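For what it's worth, using a concrete element type instead of the existential does compile for me (a sketch, assuming Renderer has a trivial initialiser), but it doesn't let me mix different transferable types:

import SwiftUI

struct MyView: View {
    // A homogeneous array with a concrete Transferable element type
    // satisfies ShareLink's generic `items` requirement.
    private var transferables: [MyTransferable] {
        [MyTransferable(renderer: Renderer()), MyTransferable(renderer: Renderer())]
    }

    var body: some View {
        ShareLink("Renders", items: transferables)
    }
}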
Is this possible? I feel like it should be?
I'm running into a crash on launch in the Simulator with my visionOS app that uses SwiftData, in Xcode Version 15.3 (15E5202a).
This is printing to the console: SwiftData/DataUtilities.swift:1093: Fatal error: Unable to parse keypath for non-PersistentModel Type, and then it's crashing on _swift_runtime_on_report.
This worked fine in Xcode 15.2.
I'll return to Xcode 15.2 for now and report an issue, but is there anyone who can translate what this even means so I can fix it?
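In case anyone can pattern-match on the message, my guess is that it relates to a keypath that passes through a type that isn't a @Model. A purely hypothetical illustration (these names are made up, not my actual code):

import Foundation
import SwiftData

struct Address: Codable {          // a plain struct, not a @Model
    var city: String
}

@Model
final class Person {
    var name: String
    var address: Address

    init(name: String, address: Address) {
        self.name = name
        self.address = address
    }
}

// Hypothetical: a predicate whose keypath dips into the non-PersistentModel
// struct, which might be the sort of keypath the parser is complaining about.
let predicate = #Predicate<Person> { $0.address.city == "London" }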
Thanks
We have an app for visionOS that is available on the Vision Pro App Store. We are releasing the same app (same SKU) on the iOS App Store next month.
We would like to make the iOS version available for pre-order, so that when following marketing links associated with our new app on iPad/iPhone, the user will be redirected to the upcoming iOS app.
I've never done a pre-order before, but I can't find the option in App Store Connect.
Is this because we have already launched the app, even though we haven't launched it on the platform we want to do a pre-order for?
Is there any way we can set up a pre-order for iOS, while the same app continues to exist in the visionOS App Store?
Thanks
I have a spherical HDR image that is being used for environment lighting in a SceneKit scene. I want to rotate the environment image.
To set the environment lighting, I use the lightingEnvironment SCNMaterialProperty. This works fine, and my scene is lit using the IBL.
As with all SCNMaterialProperty, I expect that I can use the contentsTransform property to rotate or transform the HDR. So I set it as follows:
lightingEnvironment.contentsTransform = SCNMatrix4MakeRotation((45.0).degreesAsRadians, 0.0, 1.0, 0.0)
My expectation is that the lighting environment would rotate 45 degrees around Y, but it doesn't change at all. Even if I throw in a completely random transform on all axes, there is no apparent change.
To test if there is a change, I added a chrome ball and a diffuse ball to my scene and I'm comparing reflections on the chrome ball, and lighting on the diffuse ball. There is no change on either.
It doesn't matter where I set the contentsTransform; it never works. I had intended to set it from the renderer(_:updateAtTime:) method on the SCNSceneRendererDelegate, so that I can rotate the IBL to match the point of view of the scene, but even if I transform the environment immediately after it is set, there is never a change.
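For anyone who wants to reproduce, the setup is essentially this (a minimal sketch; the HDR filename is a placeholder, and I've inlined my degreesAsRadians helper):

import SceneKit

let scene = SCNScene()

// Assign the spherical HDR as the image-based light. This part works;
// the scene is lit by the environment.
scene.lightingEnvironment.contents = "environment.hdr"
scene.lightingEnvironment.intensity = 1.0

// My expectation: rotate the IBL 45 degrees (pi/4) around Y. Nothing changes.
scene.lightingEnvironment.contentsTransform = SCNMatrix4MakeRotation(.pi / 4, 0.0, 1.0, 0.0)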
Is this a bug? Or am I doing something entirely wrong? Has anyone on here ever managed to get this to work?
We have a content creation application that uses SceneKit for rendering. In our application, we have a 3D view (non-AR) and an AR "mode" the user can go into. Currently we use an SCNView and an ARSCNView to achieve this. Our application currently targets iOS and macOS (with AR only on iOS).
With visionOS on the horizon, we're trying to bring the tech stack up to date, as SceneKit no longer seems to be actively developed, and isn't supported at all on visionOS.
We'd like to use RealityKit for 3D rendering on all platforms - macOS, iOS, and visionOS - in non-AR and AR mode where appropriate.
So far this hasn't been too difficult. The greatest challenge has been adding gesture support to replace the allowsCameraControl option on the SCNView, as no such option exists on ARView.
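For reference, the replacement we sketched is a camera rig in a non-AR ARView driven by gesture recognisers, roughly like this (heavily simplified; the names are ours):

import RealityKit
import UIKit

// A minimal orbit-camera stand-in for SCNView.allowsCameraControl,
// for an ARView running with cameraMode: .nonAR.
final class OrbitCameraController: NSObject {
    private let camera = PerspectiveCamera()
    private let pivot = Entity()

    init(arView: ARView) {
        super.init()

        // Parent the camera to a pivot so rotating the pivot orbits the camera.
        camera.position = [0, 0, 2]
        pivot.addChild(camera)

        let anchor = AnchorEntity(world: .zero)
        anchor.addChild(pivot)
        arView.scene.addAnchor(anchor)

        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        arView.addGestureRecognizer(pan)
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        let translation = gesture.translation(in: gesture.view)
        // Horizontal pan rotates the pivot around Y, orbiting the camera.
        pivot.orientation = simd_quatf(angle: Float(translation.x) * 0.005, axis: [0, 1, 0]) * pivot.orientation
        gesture.setTranslation(.zero, in: gesture.view)
    }
}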
However, now that we've reached shading, we're hitting a bit of a roadblock. When viewing the scene in non-AR mode, we would like to add a ground plane underneath the object that only displays a shadow - in other words, its opacity would be determined by light contribution. I've had a dig through the CustomMaterial API and it seems extremely primitive - there doesn't seem to be any way to get light information for a particular fragment, unless I'm missing something?
Additionally, we support a custom shader that we can apply as materials. This custom shader allows the properties of the material to vary depending on the light contribution, light incidence angle, etc. Looking at CustomMaterial, the API seems to be about defining a custom material, whereas I guess we want to customise the BRDF calculation. We achieve this in SceneKit using a series of shader modifiers hooked into the various SCNShaderModifierEntryPoint entry points.
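For reference, this is roughly how we hook into the lighting calculation in SceneKit today (simplified; the real modifiers are more involved):

import SceneKit

let material = SCNMaterial()

// The .lightingModel entry point exposes _surface, _light and
// _lightingContribution, which is the level of access we're after.
material.shaderModifiers = [
    .lightingModel: """
    float incidence = max(0.0, dot(_surface.normal, _light.direction));
    _lightingContribution.diffuse += _light.intensity.rgb * incidence;
    """
]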
On visionOS, of course, the lack of support for CustomMaterial is a shame, but I would hope something similar can be achieved with RealityComposer?
We can live with the lack of custom materials, but the missing shadow catcher is a killer for adoption for us. I'd even accept a different, limited feature set on visionOS, as long as we can match our existing feature set on existing platforms.
What am I missing?
I'm building an iOS based app that has a RealityKit view filling the screen.
I would like to present a context menu at the location of my tap, so that I can present context sensitive controls for the piece of geometry I'm tapping on.
I managed to setup my context menu similar to this:
import RealityKit
import UIKit

class MyARView: ARView, UIContextMenuInteractionDelegate {
    required init(frame frameRect: CGRect) {
        super.init(frame: frameRect)
        setupContextMenu()
    }

    required dynamic init?(coder decoder: NSCoder) {
        super.init(coder: decoder)
        setupContextMenu()
    }

    override init(frame frameRect: CGRect, cameraMode: ARView.CameraMode, automaticallyConfigureSession: Bool) {
        super.init(frame: frameRect, cameraMode: cameraMode, automaticallyConfigureSession: automaticallyConfigureSession)
        setupContextMenu()
    }

    func isValidHitTest(at location: CGPoint) -> Bool {
        // TODO: Hit test the geometry under the tap.
        return true
    }

    func contextMenuInteraction(_ interaction: UIContextMenuInteraction, configurationForMenuAtLocation location: CGPoint) -> UIContextMenuConfiguration? {
        guard isValidHitTest(at: location) else {
            return nil
        }
        return UIContextMenuConfiguration(
            identifier: nil,
            previewProvider: nil,
            actionProvider: { _ in
                let action = UIAction(title: "Menu Option") { _ in
                    print("Do the thing")
                }
                return UIMenu(title: "", children: [action])
            }
        )
    }

    func setupContextMenu() {
        addInteraction(UIContextMenuInteraction(delegate: self))
    }
}
This works technically, but it doesn't provide the behaviour I'd expect. If I tap anywhere on the view, the context menu appears at the top right of the view, never at the location of the tap. If I use a mouse with the iPad and right-click, it always appears under the mouse.
Is there any way I can configure a menu that will appear under the tap?
I don't mind if it's using UIContextMenuInteraction, and would actually prefer the visual style of UIEditMenuInteraction on iPad. But generally, as long as I can get a list of options to appear in a menu under the mouse, that's fine.
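For completeness, this is the UIEditMenuInteraction route I'm considering, presenting from an explicit point via a tap recogniser (iOS 16+; a sketch, untested against an ARView):

import UIKit

final class EditMenuView: UIView, UIEditMenuInteractionDelegate {
    private var editMenu: UIEditMenuInteraction!

    override init(frame: CGRect) {
        super.init(frame: frame)
        editMenu = UIEditMenuInteraction(delegate: self)
        addInteraction(editMenu)
        addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        // Present the menu at the exact location of the tap.
        let location = gesture.location(in: self)
        let configuration = UIEditMenuConfiguration(identifier: nil, sourcePoint: location)
        editMenu.presentEditMenu(with: configuration)
    }

    func editMenuInteraction(_ interaction: UIEditMenuInteraction,
                             menuFor configuration: UIEditMenuConfiguration,
                             suggestedActions: [UIMenuElement]) -> UIMenu? {
        UIMenu(children: [UIAction(title: "Menu Option") { _ in print("Do the thing") }])
    }
}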
I have connected my iPad running iPadOS 17 to Xcode 15 via a Thunderbolt cable (the same connection I have used with Xcode 14 and earlier), and debugging has become a nightmare.
It's been stuck on the "Copying shared symbols from iPad" step for a long time, and I can't even get my iPhone to work with the wireless debugging (it always fails).
It appears to be using wireless debugging to connect, despite the physical cable between the devices. The "Connect via network" option is enabled in the Organisation window, but is greyed out so it cannot be modified.
How can I disable wireless debugging and return to the standard method using the high-speed cable that connects the devices?
I'm hoping this is a bug that will be fixed shortly, and not the new method for connectivity when debugging.
Thanks.